-
Free, publicly-accessible full text available June 29, 2026
-
Quantum image processing (QIP) is an emerging field that integrates image processing with the principles of quantum computing (QC). As quantum technologies advance, researchers face new opportunities and challenges in developing efficient QIP techniques. This paper provides an overview of quantum image representations, with a focus on two prominent encoding schemes: the Novel Enhanced Quantum Representation (NEQR) and the Flexible Representation of Quantum Images (FRQI). We compare their performance in noisy quantum environments by evaluating qubit requirements, image quality, and computational efficiency. The study further analyzes the impact of quantum gate errors and qubit limitations on image reconstruction fidelity. We also compare GPU and QPU performance to highlight their respective strengths and weaknesses. Our findings stress the importance of error mitigation, advances in quantum hardware, and the development of quantum-classical hybrid systems to drive future progress in QIP.
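The qubit requirements that drive the NEQR-versus-FRQI comparison follow directly from the two encodings' standard definitions: NEQR stores a 2^n x 2^n image with q-bit grayscale in 2n + q qubits, while FRQI encodes intensity in the rotation angle of a single extra qubit. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def neqr_qubits(side, gray_bits=8):
    # NEQR: 2n position qubits for a 2^n x 2^n image,
    # plus q qubits holding the q-bit grayscale value directly
    n = int(math.log2(side))
    return 2 * n + gray_bits

def frqi_qubits(side):
    # FRQI: 2n position qubits plus one qubit whose rotation
    # angle encodes the pixel intensity
    n = int(math.log2(side))
    return 2 * n + 1

print(neqr_qubits(256))  # 24 qubits for a 256x256, 8-bit image
print(frqi_qubits(256))  # 17 qubits for the same image
```

The gap illustrates the trade-off the abstract evaluates: NEQR spends more qubits but allows exact bit-level retrieval, while FRQI's amplitude encoding is more compact but needs many measurements to recover intensities.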
-
Free, publicly-accessible full text available May 4, 2026
-
Free, publicly-accessible full text available August 6, 2026
-
Free, publicly-accessible full text available June 22, 2026
-
Stochastic computing (SC) is a reemerging computing paradigm that offers low-cost and noise-resilient hardware designs for a variety of arithmetic functions. In SC, circuits operate on uniform bit-streams, where the value is encoded by the probability of observing ‘1’s in the stream. The accuracy of SC operations highly depends on the correlation between input bit-streams. Some operations, such as minimum and maximum, require highly correlated inputs, whereas others, like multiplication, demand uncorrelated or statistically independent inputs for accurate results. Developing low-cost and accurate correlation manipulation circuits is critical, as they allow correlation management without incurring the high cost of bit-stream regeneration. This work introduces novel in-stream correlator and decorrelator circuits capable of: 1) adjusting correlation between stochastic bit-streams and 2) controlling the distribution of ‘1’s in the output bit-streams. Compared to state-of-the-art (SoA) approaches, our designs offer improved accuracy and reduced hardware overhead. The output bit-streams enjoy a low-discrepancy (LD) distribution, leading to a higher quality of results. To further increase the accuracy when dealing with pseudo-random inputs, we propose an enhancement module that balances the number of ‘1’s across adjacent input segments. We show the effectiveness of the proposed techniques through two application case studies: SC designs of sorting and median filtering.
Free, publicly-accessible full text available November 1, 2026
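The correlation sensitivity described above can be demonstrated with a minimal unipolar-SC sketch (software simulation, not the paper's circuits): ANDing two independent bit-streams approximates multiplication, while ANDing two maximally correlated streams, generated here by comparing both values against the same random draws, approximates the minimum.

```python
import random

def to_stream(p, n, rng):
    # unipolar SC encoding: P(bit == 1) = p
    return [1 if rng.random() < p else 0 for _ in range(n)]

def correlated_streams(p1, p2, n, rng):
    # maximally correlated pair: both values are compared against the
    # same random draws, so the '1's of the smaller value are a subset
    # of the '1's of the larger value
    r = [rng.random() for _ in range(n)]
    return ([1 if x < p1 else 0 for x in r],
            [1 if x < p2 else 0 for x in r])

def value(s):
    # decode a bit-stream back to its probability value
    return sum(s) / len(s)

rng = random.Random(1)
n = 10_000
a = to_stream(0.5, n, rng)
b = to_stream(0.4, n, rng)
prod = [x & y for x, y in zip(a, b)]   # independent inputs: AND ~ 0.5 * 0.4
c, d = correlated_streams(0.5, 0.4, n, rng)
mn = [x & y for x, y in zip(c, d)]     # correlated inputs: AND ~ min(0.5, 0.4)
```

The same AND gate thus computes two different functions depending only on input correlation, which is why in-stream correlators and decorrelators matter.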
-
Low-cost and hardware-efficient design of trigonometric functions is challenging. Stochastic computing (SC), an emerging computing model processing random bit-streams, offers promising solutions for this problem. The existing implementations, however, often overlook the importance of the data converters necessary to generate the needed bit-streams. While recent advancements in SC bit-stream generators focus on basic arithmetic operations such as multiplication and addition, energy-efficient SC design of non-linear functions demands attention to both the computation circuit and the bit-stream generator. This work introduces TriSC, a novel approach for SC-based design of trigonometric functions enjoying state-of-the-art (SOTA) quasi-random bit-streams. Unlike SOTA SC designs of trigonometric functions that heavily rely on delay elements to decorrelate bit-streams, our approach avoids delay elements while improving the accuracy of the results. TriSC yields significant energy savings of up to 92% compared to SOTA. As two novel use cases studied for the first time in SC literature, we employ the proposed design for 2D image transformation and forward kinematics of a robotic arm, two computation-intensive applications demanding low-cost trigonometric designs.
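The advantage of quasi-random bit-streams can be sketched with a base-2 radical-inverse (van der Corput) generator, which equals the first dimension of a Sobol sequence. This is a generic low-discrepancy-stream illustration, not TriSC's actual generator: for power-of-two stream lengths, the encoded value is represented exactly rather than with pseudo-random sampling error.

```python
def van_der_corput(i):
    # base-2 radical inverse: bit-reverse i across the binary point;
    # this equals the first dimension of a Sobol sequence
    f, r = 0.5, 0.0
    while i:
        if i & 1:
            r += f
        f *= 0.5
        i >>= 1
    return r

def qr_stream(p, n):
    # quasi-random bit-stream: compare p against low-discrepancy samples
    return [1 if van_der_corput(i) < p else 0 for i in range(n)]

s = qr_stream(0.75, 16)
print(sum(s) / len(s))  # 0.75 -- exact for power-of-two stream lengths
```

Because the first 2^k samples of the sequence partition [0, 1) evenly, short streams already hit the target probability, which is what lets LD-based designs cut stream length and therefore energy.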
-
Hyperdimensional computing (HDC) is an emerging computing paradigm with significant promise for efficient and robust learning. In HDC, objects are encoded with high-dimensional vector symbolic sequences called hypervectors. The quality of hypervectors, defined by their distribution and independence, directly impacts the performance of HDC systems. Despite a large body of work on the processing parts of HDC systems, little to no attention has been paid to data encoding and the quality of hypervectors. Most prior studies have generated hypervectors using built-in random functions, such as MATLAB’s or Python’s random function. This work introduces an optimization technique for generating hypervectors by employing quasi-random sequences. These sequences have recently demonstrated their effectiveness in achieving accurate and low-discrepancy data encoding in stochastic computing systems. The study outlines the optimization steps for utilizing Sobol sequences to produce high-quality hypervectors in HDC systems. An optimization algorithm is proposed to select the most suitable Sobol sequences via indexes for generating minimally correlated hypervectors, particularly in applications related to symbol-oriented architectures. The performance of the proposed technique is evaluated in comparison to two traditional approaches of generating hypervectors based on linear-feedback shift registers and MATLAB random functions. The evaluation is conducted for three applications: (i) language, (ii) headline, and (iii) medical image classification. Our experimental results demonstrate accuracy improvements of up to 10.79%, depending on the vector size. Additionally, the proposed encoding hardware exhibits reduced energy consumption and a superior area-delay product.
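A minimal sketch of low-discrepancy hypervector generation, using Halton-style radical inverses as a stand-in for the paper's Sobol sequences (all function names here are illustrative): thresholding an LD sequence at 0.5 yields exactly balanced bipolar hypervectors, and different bases yield nearly orthogonal vectors.

```python
def radical_inverse(i, base):
    # Halton-style radical inverse: digit-reverse i in the given base
    f, r = 1.0 / base, 0.0
    while i:
        r += (i % base) * f
        i //= base
        f /= base
    return r

def ld_hypervector(dim, base):
    # bipolar hypervector obtained by thresholding a low-discrepancy
    # sequence at 0.5; exactly balanced in base 2 for even dims
    return [1 if radical_inverse(i + 1, base) < 0.5 else -1 for i in range(dim)]

def similarity(a, b):
    # normalized dot product: 1.0 for identical, ~0 for unrelated vectors
    return sum(x * y for x, y in zip(a, b)) / len(a)

h1 = ld_hypervector(1024, 2)
h2 = ld_hypervector(1024, 3)
print(sum(h1))                       # 0 -- perfectly balanced
print(round(similarity(h1, h2), 2))  # near zero: nearly orthogonal
```

Balance and near-orthogonality are exactly the "distribution and independence" properties the abstract names, obtained here deterministically instead of hoping a pseudo-random generator delivers them.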
-
Precise seizure identification plays a vital role in understanding cortical connectivity and informing treatment decisions. Yet, manual diagnostic methods for epileptic seizures are both labor-intensive and highly specialized. In this study, we propose a Hyperdimensional Computing (HDC) classifier for accurate and efficient multi-type seizure classification. While previous seizure analysis efforts using HDC have been limited to binary detection (seizure or no seizure), our work breaks new ground by utilizing HDC to classify seizures into multiple distinct types. HDC offers significant advantages, such as lower memory requirements, a reduced hardware footprint for wearable devices, and decreased computational complexity. Due to these attributes, HDC can be an alternative to traditional machine learning methods, making it a practical and efficient solution, particularly in resource-limited scenarios or applications involving wearable devices. We evaluated the proposed technique on the latest version of the TUH EEG Seizure Corpus (TUSZ) dataset, and the results demonstrate noteworthy performance, achieving a weighted F1 score of 94.6%. This outcome matches or even exceeds the performance achieved by state-of-the-art traditional machine learning methods.
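The nearest-prototype classification at the heart of an HDC classifier can be sketched as a toy example (this is not the paper's EEG pipeline; the class "signatures", dimensions, and flip counts are made up for illustration): training bundles noisy samples of each class into a prototype, and inference picks the most similar prototype.

```python
import random

DIM = 2048
rng = random.Random(0)

def random_hv():
    # random bipolar hypervector
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def bundle(hvs):
    # elementwise majority vote over the bundled hypervectors
    return [1 if sum(col) >= 0 else -1 for col in zip(*hvs)]

def sim(a, b):
    # normalized dot product: ~1 for same class, ~0 for unrelated
    return sum(x * y for x, y in zip(a, b)) / DIM

def noisy(hv, flips):
    # corrupt a hypervector by flipping `flips` random positions
    out = hv[:]
    for i in rng.sample(range(DIM), flips):
        out[i] = -out[i]
    return out

# hypothetical class "signatures" standing in for encoded EEG features
sig_a, sig_b = random_hv(), random_hv()
proto_a = bundle([noisy(sig_a, 200) for _ in range(5)])  # class A prototype
proto_b = bundle([noisy(sig_b, 200) for _ in range(5)])  # class B prototype

query = noisy(sig_a, 300)  # unseen, corrupted sample of class A
pred = max([("A", sim(query, proto_a)), ("B", sim(query, proto_b))],
           key=lambda t: t[1])[0]
print(pred)  # nearest prototype wins
```

Training and inference reduce to elementwise additions and a dot product per class, which is the source of the low memory and hardware footprint the abstract highlights; extending from binary to multi-type classification only adds one prototype per seizure type.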